-
Elicitation studies have become a popular method of participatory design. While traditionally used to examine unimodal gesture interactions, elicitation has begun to be used with other novel interaction modalities. Unfortunately, no prior work has examined the impact of referent display on elicited interaction proposals. To address that concern, this work provides a detailed comparison between two elicitation studies that were similar in design apart from the way that participants were prompted for interaction proposals (i.e., the referents). Based on this comparison, the impact of referent display on speech and gesture interaction proposals is discussed. The interaction proposals between these elicitation studies were not identical. Gesture proposals were the least impacted by referent display, showing high proposal similarity between the two works. Speech proposals were highly biased by text referents, with proposals directly mirroring text-based referents an average of 69.36% of the time. In short, the way that referents are presented during an elicitation study can impact the resulting interaction proposals; however, the level of impact depends on the modality of input being elicited.
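As an illustration only (not taken from the paper above), the following minimal Python sketch shows one way a "mirroring" percentage of this kind could be computed: it counts the speech proposals whose wording repeats a text referent verbatim. The referent label and proposal strings are hypothetical.

```python
# Illustrative sketch with hypothetical data: estimate how often speech
# proposals directly mirror the text shown as a referent during elicitation.

def mirroring_rate(referent_text: str, proposals: list[str]) -> float:
    """Fraction of speech proposals that repeat the referent text verbatim
    (case-insensitive, ignoring surrounding whitespace)."""
    target = referent_text.strip().lower()
    if not proposals:
        return 0.0
    mirrored = sum(1 for p in proposals if p.strip().lower() == target)
    return mirrored / len(proposals)

# Hypothetical proposals collected for a "rotate left" text referent.
proposals = ["rotate left", "turn it left", "Rotate left", "spin it counterclockwise"]
print(f"{mirroring_rate('rotate left', proposals):.2%}")  # -> 50.00%
```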
-
Interacting in stereoscopic head-mounted displays can be difficult, and there are not yet clear standards for how interactions in these environments should be performed. Virtual reality offers a number of well-designed interaction techniques; however, augmented reality interaction techniques still need to be improved before they can be used easily. This dissertation covers work done toward understanding how users navigate and interact with virtual environments displayed in stereoscopic head-mounted displays. With this understanding, existing techniques from virtual reality devices ... (For more, see "View full record.")
-
This research establishes a better understanding of the syntax choices in speech interactions and of how speech, gesture, and multimodal gesture-and-speech interactions are produced by users in unconstrained object manipulation environments using augmented reality. The work presents a multimodal elicitation study conducted with 24 participants. The canonical referents for translation, rotation, and scale were used along with some abstract referents (create, destroy, and select). In this study, time windows for gesture and speech multimodal interactions are developed using the start and stop times of gestures and speech as well as the stroke times of gestures. While gestures commonly precede speech by 81 ms, we find that the stroke of the gesture is commonly within 10 ms of the start of speech, indicating that the information content of a gesture and its co-occurring speech are well aligned with each other. Lastly, the trends across the most common proposals for each modality are examined, showing that disagreement between proposals is often caused by variation in hand posture or syntax. This allows us to present aliasing recommendations to increase the percentage of users' natural interactions captured by future multimodal interactive systems.
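To make the time-window idea above concrete, here is a minimal sketch of how gesture-to-speech onset offsets and stroke-to-speech-onset offsets could be computed from per-trial event times. The timestamps and field names are hypothetical; this is not the authors' analysis code.

```python
# Illustrative sketch with hypothetical data: compute gesture/speech timing
# offsets of the kind used to define multimodal fusion time windows.
from statistics import mean

# Each trial records gesture start/stroke/stop and speech start/stop, in ms.
trials = [
    {"gesture_start": 1000, "gesture_stroke": 1090, "gesture_stop": 1600,
     "speech_start": 1085, "speech_stop": 1900},
    {"gesture_start": 2400, "gesture_stroke": 2475, "gesture_stop": 2950,
     "speech_start": 2470, "speech_stop": 3200},
]

# Positive values mean the gesture event occurs before speech onset.
onset_lead = [t["speech_start"] - t["gesture_start"] for t in trials]
stroke_offset = [t["speech_start"] - t["gesture_stroke"] for t in trials]

print(f"mean gesture-onset lead over speech onset: {mean(onset_lead):.1f} ms")
print(f"mean stroke-to-speech-onset offset: {mean(stroke_offset):.1f} ms")
```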
